6-5
Simulating Client/Server Performance

FREDERICK W. SCHOLL

Today’s networks carry more business-critical applications than ever. Although the mainframe is still the workhorse for transaction-processing and financial applications, more firms are using client/server architectures for these needs. One major advantage of the mainframe environment is the well-established performance guarantees that come with this technology. End users are demanding the same performance guarantees from client/server technology. Network modeling and simulation can help network managers design end-user performance requirements into their network infrastructures.

MATCHING SOLUTIONS TO APPLICATION REQUIREMENTS

Although desktop PCs have distributed computing power throughout the organization, more recent trends point toward consolidation. Firms are moving servers into common areas to reduce management and upkeep costs. Data storage with backup naturally lends itself to centralized configurations. Servers and their operating systems have improved to the point where large-scale, enterprisewide, business-critical applications can be supported. Managing the performance of this open, client/server environment is becoming more difficult.

For each end-user application need, a wide range of technology solutions exists, and every technology decision carries an enormous range of costs. It is easy to overspend or waste money on unproductive technology. For example, monthly costs for a 56K-bps private line are about $600, whereas a T3 line costs approximately $42,000 per month. Meanwhile, desktop Pentium machines can easily be obtained for under $2,000. The issue is how to match technology solutions to application requirements. This chapter discusses application and network performance improvements and how to obtain them using modeling and simulation.
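A quick back-of-the-envelope calculation makes the tradeoff concrete. The following sketch (plain Python; the only inputs are the approximate circuit prices cited above, and the link entries are illustrative) compares absolute monthly cost with cost per Mbps:

LINKS = {
    "56K-bps private line": {"bps": 56_000, "monthly_cost": 600},
    "T3 private line":      {"bps": 45_000_000, "monthly_cost": 42_000},
}

for name, link in LINKS.items():
    mbps = link["bps"] / 1_000_000
    cost_per_mbps = link["monthly_cost"] / mbps
    print(f"{name}: ${link['monthly_cost']:,}/month, "
          f"about ${cost_per_mbps:,.0f} per Mbps")

The T3 is far cheaper per megabit, but if the application never fills the pipe, the roughly $41,000-per-month difference is pure overspending. The right choice depends on measured demand, which is exactly what modeling and simulation are meant to establish.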

MANAGING PERFORMANCE IN A CHANGING INFORMATION LANDSCAPE

Change characterizes today’s network environment. The challenge is to manage performance in the face of constant change in network infrastructure, technology, and applications. End-user needs must be met despite reduced IS budgets. This quality of service can be accomplished only with the use of automated network data-gathering and modeling tools that create a computer model of the corporate information infrastructure. A simulation model—or virtual laboratory—permits the network or technology manager to visualize the effects of change before implementation.
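To make the idea of a virtual laboratory concrete, the sketch below is a minimal, hypothetical what-if experiment written in plain Python rather than any particular commercial modeling tool. It treats a shared application server as a single first-come, first-served queue with random (exponential) arrival and service times and predicts mean response time as the offered load grows, before any hardware is purchased:

import random

def simulate_response_time(arrival_rate, service_rate, n_requests=100_000, seed=1):
    """Crude single-server queue model of a shared application server.

    arrival_rate: client requests per second offered to the server
    service_rate: requests per second the server can complete
    Returns the predicted mean response time (queueing delay + service).
    """
    rng = random.Random(seed)
    clock = 0.0            # arrival time of the current request
    server_free_at = 0.0   # time at which the server clears its backlog
    total_response = 0.0

    for _ in range(n_requests):
        clock += rng.expovariate(arrival_rate)      # next request arrives
        start = max(clock, server_free_at)          # wait if the server is busy
        server_free_at = start + rng.expovariate(service_rate)
        total_response += server_free_at - clock    # waiting time + service time

    return total_response / n_requests

# What-if experiment: the same 100-request-per-second server under growing load.
for offered_load in (20, 40, 60, 80, 95):
    mean_rt = simulate_response_time(offered_load, 100.0)
    print(f"{offered_load:3d} req/s -> predicted mean response {mean_rt:.3f} s")

Commercial modeling tools perform this kind of experiment across an entire topology, typically driving the model with traffic captured from the live network rather than assumed distributions.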

Six forces are altering the information landscape and making performance management more difficult:

  Server consolidation.
  Remote office communication.
  Business reengineering.
  Powerful desktop machines.
  New networking technology.
  Multimedia applications.

Server Consolidation. The increasing power of multiprocessor servers has made it possible to centralize LAN data and application servers. In this new glasshouse, the performance of the connecting network pipes and the servers becomes extremely critical.

Remote Office Communication. Business thinking today says, “Put the employees where the customers are.” This directive includes not only sales and marketing, but also engineering and research staff. For some firms, the employee office is wherever the local network communications port is found. Connectivity of headquarters to remote offices has become an important business need.

Business Reengineering. The growth of remote offices is one example of business reengineering, in which technology is used to simplify and change work processes. Client/server applications have proliferated as processes are moved off older mainframe computers and onto the desktop. New applications can create significant growth in network traffic. For example, by adopting scanning technology, some firms have been able to reduce office paperwork; however, a single scanned 8 1/2- by 11-inch letter generates about 50KB of network traffic.
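The 50KB-per-page figure is easy to turn into a rough capacity estimate. In the sketch below, the office size and scanning volume are invented planning inputs, not figures from the chapter; only the per-page size comes from the text:

KB_PER_PAGE = 50             # approximate size of one scanned letter (from the text)
users = 200                  # hypothetical branch office
pages_per_user_per_day = 40  # hypothetical scanning volume
business_seconds = 8 * 3600  # traffic spread over an eight-hour day

total_kb = users * pages_per_user_per_day * KB_PER_PAGE
average_bps = total_kb * 1024 * 8 / business_seconds

print(f"Added traffic: about {total_kb / 1024:.0f} MB per day, "
      f"averaging {average_bps / 1000:.0f}K bps over the business day")

Even spread evenly across the day, this hypothetical office's scanning traffic alone exceeds the capacity of a 56K-bps line, and real traffic arrives in bursts rather than evenly.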

Powerful Desktop Machines. Most users have come to expect ever-faster computer technology. As performance has improved, files have gotten larger and network I/O has increased. Users are moving from 16-bit operating systems to 32-bit systems such as OS/2, Windows 95, and Windows NT. Processors are being upgraded from 100-MIPS Pentium machines to 300-MIPS P6 chips, and arrays of these chips are being deployed in multiprocessor servers.

New Networking Technology. New networking technologies appear every six months. A few years ago, network managers had to choose only between Ethernet and Token Ring LANs. Then fiber distributed data interface (FDDI) came along. Since then, many new network technologies have been introduced, including asynchronous transfer mode (ATM), switched Token Ring and Ethernet, 100M-bps Ethernet, and frame relay.

Without simulation and modeling, it is impossible to make an intelligent choice of how (or whether) to upgrade network technology. It is easy to make the mistake of installing an overdesigned and overpriced network.

Multimedia Applications. Although desktop multimedia is not yet commonplace, it will change the ground rules for network data transmission. For multimedia traffic, network error rates matter less than end-to-end latency. For example, for acceptable voice quality in a videoconference, end-to-end latency must be less than 200 milliseconds, yet error rates as high as 1 × 10^-5 do not noticeably degrade the perceived quality. For data applications, by contrast, latencies of one to three seconds are typically tolerable, but end-to-end error rates must be zero.
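These two service classes can be captured as simple latency and error-rate budgets. The sketch below is illustrative only: the class names, the table layout, and the measured values are assumptions made for the example, while the 200-millisecond, 1 × 10^-5, and one-to-three-second figures come from the discussion above:

REQUIREMENTS = {
    # application class: (max end-to-end latency in seconds, max tolerable error rate)
    "videoconference_voice": (0.200, 1e-5),
    "transactional_data":    (3.000, 0.0),
}

def meets_requirements(app_class, latency_s, error_rate):
    max_latency, max_errors = REQUIREMENTS[app_class]
    return latency_s <= max_latency and error_rate <= max_errors

print(meets_requirements("videoconference_voice", 0.150, 1e-6))  # True: fast enough, some loss tolerable
print(meets_requirements("transactional_data", 1.2, 1e-7))       # False: data must be error-free

A simulation model can report exactly these two quantities for each application flow, which makes it possible to verify multimedia readiness before deployment.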

FOUR STEPS TO IMPROVING NETWORK PERFORMANCE

Network managers often face short deadlines and demanding requirements when rolling out new client/server applications or improving network performance. Simulating the new environment before rollout helps ensure that application response-time requirements will be met and avoids the recurring need to redevelop applications or redesign the network afterward. Although that cycle of rework has become common in IS departments, it does not have to continue. This section outlines four steps for simulating network environments to improve network performance (see Exhibit 6-5-1).


Exhibit 6-5-1. Client/Server Performance Management Four-Step Process

